    Design and Analysis of a Task-based Parallelization over a Runtime System of an Explicit Finite-Volume CFD Code with Adaptive Time Stepping

    FLUSEPA (Registered trademark in France No. 134009261) is an advanced simulation tool that supports a wide range of aerodynamic studies. It is the unstructured finite-volume solver developed by Airbus Safran Launchers to compute compressible, multidimensional, unsteady, viscous and reactive flows around bodies in relative motion. Time integration in FLUSEPA is performed with an explicit temporal adaptive method. The current production version of the code is based on MPI and OpenMP, an implementation that incurs costly synchronizations which must be reduced. To tackle this problem, we present a study of a task-based parallelization of the aerodynamic solver of FLUSEPA using the StarPU runtime system, combining up to three levels of parallelism. We validate our solution by simulating, on a finite-volume mesh with 80 million cells, the propagation of a take-off blast wave for the Ariane 5 launcher. Comment: Accepted manuscript of a paper in the Journal of Computational Science.
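
    The core of any explicit temporal-adaptive scheme is choosing the largest stable time step from a CFL condition. The sketch below illustrates that idea on 1-D linear advection with a first-order upwind flux; it is a minimal illustration only, not FLUSEPA's actual (3-D, viscous, reactive, multi-level) scheme.

```python
import numpy as np

def explicit_fv_step(u, dx, c, cfl=0.9):
    """One explicit finite-volume update for 1-D linear advection
    (first-order upwind, c > 0), with the time step chosen
    adaptively from the CFL condition. Illustrative only."""
    dt = cfl * dx / abs(c)                  # largest stable step
    flux = c * u                            # upwind flux
    u_new = u.copy()                        # inflow cell u[0] held fixed
    u_new[1:] -= dt / dx * (flux[1:] - flux[:-1])
    return u_new, dt

# usage: advect a Gaussian pulse one step
x = np.linspace(0.0, 1.0, 200)
u = np.exp(-0.5 * ((x - 0.3) / 0.05) ** 2)
u, dt = explicit_fv_step(u, dx=x[1] - x[0], c=1.0)
```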

    Improving MCMC Using Efficient Importance Sampling

    This paper develops a systematic Markov Chain Monte Carlo (MCMC) framework based upon Efficient Importance Sampling (EIS) which can be used for the analysis of a wide range of econometric models involving integrals without an analytical solution. EIS is a simple, generic and yet accurate Monte-Carlo integration procedure based on sampling densities which are chosen to be global approximations to the integrand. By embedding EIS within MCMC procedures based on Metropolis-Hastings (MH), one can significantly improve their numerical properties, essentially by providing a fully automated selection of critical MCMC components such as auxiliary sampling densities, normalizing constants and starting values. The potential of this integrated MCMC-EIS approach is illustrated with simple univariate integration problems and with the Bayesian posterior analysis of stochastic volatility models and stationary autoregressive processes. Keywords: Autoregressive models, Bayesian posterior analysis, dynamic latent variables, Gibbs sampling, Metropolis-Hastings, stochastic volatility.
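
    In its simplest univariate form, EIS fits the auxiliary sampler to the integrand by iterated least-squares regression of the log-integrand on the sampler's natural parameters. The following sketch is a minimal Gaussian-sampler version under our own simplifying assumptions, not the paper's full algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def eis_gaussian(log_phi, mu0=0.0, sig0=1.0, n=1000, iters=5):
    """Estimate the integral of phi = exp(log_phi) with a Gaussian
    importance sampler fitted EIS-style: regress log phi on
    (1, x, x^2) and read off the implied Gaussian kernel, iterate."""
    mu, sig = mu0, sig0
    for _ in range(iters):
        x = rng.normal(mu, sig, n)
        X = np.column_stack([np.ones(n), x, x ** 2])
        b = np.linalg.lstsq(X, log_phi(x), rcond=None)[0]
        sig = np.sqrt(-0.5 / b[2])          # b2 = -1/(2 sigma^2)
        mu = b[1] * sig ** 2                # b1 = mu / sigma^2
    x = rng.normal(mu, sig, n)
    log_sampler = (-0.5 * ((x - mu) / sig) ** 2
                   - np.log(sig * np.sqrt(2.0 * np.pi)))
    return np.exp(log_phi(x) - log_sampler).mean()

# example: an unnormalized, slightly skewed density
est = eis_gaussian(lambda x: -0.5 * x ** 2 + 0.3 * np.sin(x))
```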

    The Multinomial Multiperiod Probit Model: Identification and Efficient Estimation

    In this paper we discuss parameter identification and likelihood evaluation for multinomial multiperiod Probit models. It is shown in particular that the standard autoregressive specification used in the literature can be interpreted as a latent common factor model. However, this specification is not invariant with respect to the selection of the baseline category. Hence, we propose an alternative specification which is invariant with respect to such a selection and identifies coefficients characterizing the stationary covariance matrix which are not identified in the standard approach. For likelihood evaluation, which requires high-dimensional truncated integration, we propose to use a generic procedure known as Efficient Importance Sampling (EIS). A special case of our proposed EIS algorithm is the standard GHK probability simulator. To illustrate the relative performance of both procedures, we perform a set of Monte-Carlo experiments. Our results indicate substantial numerical efficiency gains of the ML estimates based on GHK-EIS relative to ML estimates obtained by using GHK. Keywords: Discrete choice, importance sampling, Monte-Carlo integration, panel data, parameter identification, simulated maximum likelihood.
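
    The GHK simulator referred to above evaluates rectangle probabilities of a multivariate normal by sampling truncated normals sequentially along the Cholesky factor of the covariance matrix. A minimal sketch of the plain GHK case, without the EIS refinement the paper proposes:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def ghk(lower, upper, cov, n=10_000):
    """GHK simulator for P(lower < X < upper), X ~ N(0, cov):
    draw truncated standard normals sequentially and average the
    products of conditional probability masses."""
    L = np.linalg.cholesky(cov)
    d = len(lower)
    prob = np.ones(n)
    e = np.zeros((n, d))
    for j in range(d):
        m = e[:, :j] @ L[j, :j]             # conditional mean so far
        lo = norm.cdf((lower[j] - m) / L[j, j])
        hi = norm.cdf((upper[j] - m) / L[j, j])
        prob *= hi - lo                      # conditional prob mass
        u = lo + rng.uniform(size=n) * (hi - lo)
        e[:, j] = norm.ppf(u)                # truncated-normal draw
    return prob.mean()

# example: positive-orthant probability of an equicorrelated
# trivariate normal with rho = 0.5 (analytically 1/4)
cov = 0.5 * np.ones((3, 3)) + 0.5 * np.eye(3)
p = ghk(np.zeros(3), np.full(3, np.inf), cov)
```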

    Classical and Bayesian Analysis of Univariate and Multivariate Stochastic Volatility Models

    In this paper, Efficient Importance Sampling (EIS) is used to perform a classical and Bayesian analysis of univariate and multivariate Stochastic Volatility (SV) models for financial return series. EIS provides a highly generic and very accurate procedure for the Monte Carlo (MC) evaluation of high-dimensional interdependent integrals. It can be used to carry out ML estimation of SV models as well as simulation smoothing, where the latent volatilities are sampled at once. Based on this EIS simulation smoother, a Bayesian Markov Chain Monte Carlo (MCMC) posterior analysis of the parameters of SV models can be performed. Keywords: Dynamic latent variables, Markov Chain Monte Carlo, maximum likelihood, simulation smoother.
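
    For reference, the canonical univariate SV model is h_t = mu + phi (h_{t-1} - mu) + sigma_eta eta_t with returns y_t = exp(h_t / 2) eps_t, and its likelihood is an integral over the latent volatility path. The sketch below simulates the model and estimates that integral with the "natural" sampler (the latent transition density itself), which is precisely the inefficient baseline EIS improves upon; it is a minimal illustration, not the paper's smoother:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_sv(T, mu=-1.0, phi=0.95, sig_eta=0.2):
    """Simulate y_t = exp(h_t/2) eps_t with AR(1) log-volatility h_t."""
    h = np.empty(T)
    h[0] = rng.normal(mu, sig_eta / np.sqrt(1.0 - phi ** 2))
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + sig_eta * rng.normal()
    return np.exp(h / 2.0) * rng.normal(size=T)

def loglik_prior_is(y, mu, phi, sig_eta, n=5000):
    """Monte-Carlo log-likelihood: draw n latent paths from the
    prior and average prod_t p(y_t | h_t). Degenerates quickly
    as T grows, which is why EIS-type samplers are needed."""
    h = rng.normal(mu, sig_eta / np.sqrt(1.0 - phi ** 2), size=n)
    logw = np.zeros(n)
    for t in range(len(y)):
        # log N(y_t; 0, exp(h_t)) accumulated along each path
        logw += -0.5 * (np.log(2.0 * np.pi) + h + y[t] ** 2 * np.exp(-h))
        h = mu + phi * (h - mu) + sig_eta * rng.normal(size=n)
    m = logw.max()
    return m + np.log(np.mean(np.exp(logw - m)))   # stable log-mean-exp

y = simulate_sv(200)
ll = loglik_prior_is(y, mu=-1.0, phi=0.95, sig_eta=0.2)
```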

    Integrated transfers of terrigenous organic matter to lakes at their watershed level: A combined biomarker and GIS analysis

    Terrigenous organic matter (TOM) transfer from a watershed to a lake plays a key role in contaminant fate and greenhouse gas emissions in these aquatic ecosystems. In this study, we linked the physiographic and vegetation characteristics of a watershed with the nature of the TOM deposited in lake sediments. TOM was characterized using lignin biomarkers as indicators of TOM sources and state of degradation. A geographical information system (GIS) also allowed us to integrate and describe the landscape morpho-edaphic characteristics of a defined drainage basin. Combining these tools, we found a significant positive relationship (R2 = 0.65, p < 0.002) between the mean slope of the watershed and the terrigenous fraction estimated by Λ8 in recent sediments. The mean slope also correlated with the composition of TOM in recent sediments, as the P/(V + S) and 3,5Bd/V ratios decreased significantly with the steepness of the watersheds (R2 = 0.57, p < 0.021 and R2 = 0.71, p < 0.004, respectively). More precisely, areas with slopes between 4° and 10° have a major influence on TOM inputs to lakes. The vegetation composition of each watershed influenced the composition of the recent sediments of the sampled lakes: the presence of angiosperm trees in a watershed increased the export of TOM to the lake, as Λ8 increased significantly with the abundance of this type of vegetation (R2 = 0.44, p < 0.019). A similar relationship was also observed with S/V ratios, an indicator of angiosperm sources of TOM. The type of vegetation also greatly influenced the degradation state of the OM. In this study, we were able to determine that low-sloped areas (0–2°) act as buffer zones for lignin inputs and, by extension, for TOM loading to sediments. The relative contribution of TOM from soil organic horizons also increased in steeper watersheds. This study has significant implications for our understanding of the fate of TOM in lacustrine ecosystems.
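
    The slope-Λ8 relationships above are ordinary least-squares regressions. As a purely hypothetical illustration of how such an R2 and p-value are computed (the numbers below are invented, not the paper's measurements):

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical watershed values for illustration only; the paper's
# actual mean-slope and lambda-8 measurements are not reproduced here.
mean_slope_deg = np.array([2.1, 3.5, 4.8, 6.0, 7.2, 8.9, 10.3, 11.5])
lambda8 = np.array([1.2, 0.9, 1.8, 1.6, 2.5, 2.2, 3.1, 2.9])

fit = linregress(mean_slope_deg, lambda8)   # OLS fit, as in the study
print(f"R2 = {fit.rvalue ** 2:.2f}, p = {fit.pvalue:.4f}")
```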

    Dynamic Factor Models for Multivariate Count Data: An Application to Stock-Market Trading Activity

    We propose a dynamic factor model for the analysis of multivariate time-series count data. Our model allows for idiosyncratic as well as common serially correlated latent factors in order to account for potentially complex dynamic interdependence between series of counts. The model is estimated under alternative count distributions (Poisson and negative binomial). Maximum-likelihood estimation requires high-dimensional numerical integration in order to marginalize the joint distribution with respect to the unobserved dynamic factors. We rely upon the Monte-Carlo integration procedure known as Efficient Importance Sampling, which produces fast and numerically accurate estimates of the likelihood function. The model is applied to time-series data consisting of the number of trades in 5-minute intervals for five NYSE stocks from two industrial sectors. The estimated model accounts for all key dynamic and distributional features of the data. We find strong evidence of a common factor, which we interpret as reflecting market-wide news. In contrast, sector-specific factors are found to be statistically insignificant. Keywords: Dynamic latent variables, importance sampling, mixture of distribution models, Poisson distribution, simulated maximum likelihood.
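
    As a rough illustration of this model class (with invented loadings and parameter values, not the paper's specification or estimates), the sketch below simulates Poisson counts whose log-intensities load on a common AR(1) factor plus series-specific AR(1) factors:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_count_factor(T=500, k=5, phi_c=0.9, phi_i=0.5,
                          sig_c=0.3, sig_i=0.2, base=2.0):
    """Poisson counts with a common and idiosyncratic latent AR(1)
    factors in the log-intensity. All parameters are illustrative."""
    load = rng.uniform(0.5, 1.5, k)          # loadings on common factor
    f = 0.0                                   # common factor
    g = np.zeros(k)                           # idiosyncratic factors
    y = np.empty((T, k), dtype=int)
    for t in range(T):
        f = phi_c * f + sig_c * rng.normal()
        g = phi_i * g + sig_i * rng.normal(size=k)
        lam = base * np.exp(load * f + g)     # Poisson intensities
        y[t] = rng.poisson(lam)
    return y

counts = simulate_count_factor()              # T x 5 matrix of counts
```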

    Bayesian Analysis of a Probit Panel Data Model with Unobserved Individual Heterogeneity and Autocorrelated Errors

    In this paper, we perform a Bayesian analysis of a panel probit model with unobserved individual heterogeneity and serially correlated errors. We augment the data with latent variables and sample the unobserved heterogeneity component as one Gibbs block per individual, using a flexible piecewise-linear approximation to the marginal posterior density. The latent time effects are simulated as another Gibbs block. For this purpose, we develop a new user-friendly form of the Efficient Importance Sampling proposal density for an Acceptance-Rejection Metropolis-Hastings step. We apply our method to the analysis of the product innovation activity of a panel of German manufacturing firms in response to imports, foreign direct investment and other control variables. The dataset used here was analyzed under more restrictive assumptions by Bertschek and Lechner (1998) and Greene (2004). Although our results differ to a certain degree from those benchmark studies, we confirm the positive effect of imports and FDI on firms' innovation activity. Moreover, unobserved firm heterogeneity is shown to play a far more significant role in this application than the latent time effects. Keywords: Dynamic latent variables, Markov Chain Monte Carlo, importance sampling.
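
    The data-augmentation step underlying such probit samplers is standard (Albert-Chib): given the current linear predictor, each latent utility is drawn from a normal truncated to the side implied by the observed outcome. A minimal sketch of that single step (the paper's full sampler, with its heterogeneity blocks and AR(1) errors, is much richer):

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

def augment_latent(y, Xb):
    """Draw z_i ~ N(x_i'beta, 1) truncated to (0, inf) if y_i = 1
    and to (-inf, 0) if y_i = 0. Bounds passed to truncnorm are in
    standardized units, i.e. shifted by the linear predictor Xb."""
    lo = np.where(y == 1, -Xb, -np.inf)
    hi = np.where(y == 1, np.inf, -Xb)
    return truncnorm.rvs(lo, hi, loc=Xb, scale=1.0, random_state=rng)

# example: one augmentation step for 10 simulated observations
X = rng.normal(size=(10, 2))
beta = np.array([0.5, -1.0])
y = (X @ beta + rng.normal(size=10) > 0).astype(int)
z = augment_latent(y, X @ beta)
```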